Patty Vest: Welcome to Sagecast, the podcast of Pomona College. I'm Patty Vest.
Mark Wood: And I'm Mark Wood.
Patty Vest: In these extraordinary times, we're coming to you from our various homes, as we all shelter in place.
Mark Wood: This season on Sagecast, we're talking to Pomona faculty and alumni about the personal, professional and intellectual journeys that have brought them to where they are today.
Patty Vest: Today, we're talking with Assistant Professor of Computer Science Eleanor Birrell, whose primary research interests are in security and data privacy.
Mark Wood: Welcome, Eleanor.
Eleanor Birrell: Thank you. Pleasure to be here.
Mark Wood: So, I understand that you just came from a class. How are you and your students adjusting to this life in the time of coronavirus?
Eleanor Birrell: It's definitely been an adjustment. I've been trying a lot of experiments, trying different things, trying to see what's working and what's not working. And I'm really fortunate that my students keep telling me what works, what doesn't work, what I should try, what I should not try again. So I've been playing with a couple of different things. I've been flipping the classroom: I've been recording video lectures and then using class time, once a week, to meet with the students in smaller groups, groups of about 12 students, and go over practice problems and questions and answers, and get them set up for the assignment for that week.
Patty Vest: Okay. Eleanor, tell us a little bit about your early years. Were you a child of the digital age, or did you discover your love for computer science later in life?
Eleanor Birrell: I guess I always had computers around. I grew up in the Bay Area, so I certainly knew people, had friends whose parents were working in the tech industry. I never thought I would be in computer science. In fact, I intentionally wanted to do something different. I went to college, [inaudible 00:01:54] California, tried all sorts of different things, started doing some math, had a lot of fun doing pure mathematics for a while, took an interesting theory of computation course and an interesting cryptography course, and discovered they were way cooler than pure math. I wanted to take algorithms, but that had a programming prereq, so I took intro to computer science pretty late and just kept falling in love with it. So I'm still here, still doing that.
Mark Wood: So you went to two pretty good institutions, Harvard and Cornell. Can you tell us a little bit about your time there, and did you find any special mentors along the way?
Eleanor Birrell: A number, yes. I have been very fortunate in the institutions I've been affiliated with. Going to Harvard and having a group of very intelligent, very hardworking, wonderful people to be around, both students and faculty, it had some of the advantages of a liberal arts school, where you would get to know your classmates, you'd be taking classes with a lot of the same people, you got to spend time with professors and do research with professors, but it also had the larger selection of course offerings that you can't always find at a smaller school like Pomona.
Eleanor Birrell: And I was fortunate to have some great academic mentors and research mentors, and ended up having a lot of fun doing research as an undergraduate, went to grad school, worked with a few different people in grad school, ended up coming out with a PhD, and along the way started teaching. It was not really an intentional thing; it was a job advertisement in the math department when I was doing math courses. I started doing more teaching and enjoying it, and came out of grad school realizing I wanted to find an institution that valued both of my interests, both teaching and research. And I was very lucky to end up here.
Patty Vest: So that leads us perfectly into our next question. How did you end up pursuing an academic career? Why did you decide to pursue an academic career instead of going into the for-profit world of computer science, or into tech?
Eleanor Birrell: Fundamentally, it came down to the fact that I find working with students very rewarding, and teaching students very rewarding. And I wanted to be somewhere where that was not just something I could do, but something that was a priority for everyone around me. Having time to teach, having a wonderful group of students to work with, but also having a wonderful group of faculty in my department who I can discuss teaching with and bounce ideas off of, is a really great environment to be working in.
Mark Wood: So your research is in an area that we've all become very conscious of, sometimes the hard way: data security and privacy. What drew you to that field of study, in particular that specialization?
Eleanor Birrell: I guess I got there one step at a time. I mentioned I took a cryptography course early on when I was an undergraduate, and started doing research in cryptography both as an undergraduate and in my first couple of years in the PhD program. Security is more interesting: it has the same sort of fascinating technical puzzles and problems, but with the real-world motivation and that "so what" factor that I find really motivating to work on, and the fact that the problems I'm working on can tie into current events can be very inspiring. So you can look at things like, how would you build a system that specifies and actually enforces restrictions on how personal data can be used? This is a problem I've been working on for the last few years now. I had a paper published last October, I think, on how to do use-based privacy enforcement for location data, and suddenly seeing the potential implications of that sort of work can be a really nice thing, and it keeps inspiring you to work on new puzzles.
Patty Vest: How do you approach your research? How do you tackle these very, very relevant and timely issues?
Eleanor Birrell: I try to keep an eye on what's going on and draw inspiration from that. So one of the projects I'm starting up at the moment is looking at the California data protection regulation that went into effect in January of this year, and how that's being implemented, how that's being enforced, and how those notifications are being used more broadly.
Mark Wood: So you mentioned use-based privacy. You've said the current system, which is, I guess, called notice and consent, is inadequate to protect our privacy. How does the current system work, why do you say it's not enough, and where do you think we should go?
Eleanor Birrell: Notice and consent is fundamentally the idea that, on a philosophical level, if someone knows what you're going to do and they say, "That's okay," then there's been no privacy problem. Which sounds fine in theory, until you look at how it's implemented. And how it's implemented is that there are long legalese documents available on a website somewhere, and if you use the service, you've agreed to everything in that document. This doesn't actually constitute informed consent. People don't read these documents, and if they try to read them, they're very confused. They can be hard to interpret even for an expert. And even if you are an expert, it can take hours and hours of your life to try to read all of these documents. So people are not providing informed consent. Instead, people are just, by default, having to accept whatever the terms are, without any power to negotiate those terms.
Eleanor Birrell: Fundamentally, the problem boils down to the fact that you either are in, or you're out. You use the service and agree to the terms, or you don't use the service. And in many cases, not using the service is not a practical choice. So the idea of use-based privacy is instead to try to codify societal norms about what reasonable behavior is and what reasonable data use is, and allow data to be used in those particular ways across the board. And that would be a more uniform notion of what privacy is, rather than saying privacy is only the things that you've not agreed to.
Patty Vest: How would we go about enforcing a privacy system in what feels like the freewheeling world of cyberspace?
Eleanor Birrell: Fundamentally, it would require regulatory changes. That sort of thing can happen within a community, it can happen if companies voluntarily decide that there are some standards within this industry and they're going to comply with those standards, or it can happen externally. But fundamentally, if we want to have absolute guarantees for how data are used, those can't always be made at the individual level.
Mark Wood: So tell us a little more about use-based privacy, the thing you've been talking about. What should the average user know about it? And if it requires some kind of regulatory change, why should companies give up what is a much easier process, and one that gives them more of the data they want without having to negotiate?
Eleanor Birrell: The current regime gives them the data they want, if people continue to use the services. But you're currently seeing limitations on what data people disclose. There are very few applications that are able to use medical records, for example, because people are much more conscious about that. The same tends to occur with banking data. And there are lots of potential applications of this data, and we're seeing that now with COVID and the discussion around location tracking services: if you can tie into more data, you can do more things. But in order to tie into more data, you need people to agree to share their data. And I fundamentally think people would be willing to share more data if there were some minimum guarantees on how that data would be used.
Patty Vest: Talk to us a little bit about that, because a lot of states are considering it, and a lot of the tech companies are being called on to help. Some people have deemed it very important for continuing to slow the spread of the virus, but a lot of people have concerns about their privacy. What would be a suggestion you have for something like that to work?
Eleanor Birrell: There's been a lot of discussion about this exact question, because again, this is an example of somewhere where the upsides of having access to the data and analyzing the data are very self-evident, but there are real privacy concerns. And there are multiple approaches that people are considering, but one that seems promising is the idea of using proximity data instead of location data. If what you want to do is detect who you come into contact with, or how long you're in contact with people, or how close you are to them, that doesn't require the actual location. You don't need the GPS data for that. You could use something like Bluetooth, which detects proximity instead. And this idea of using less data to get more utility might be a promising compromise between the different options.
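To make the proximity-versus-location distinction concrete, here is a minimal illustrative sketch, not taken from the interview or from any particular contact-tracing system, of a contact log that records Bluetooth-style proximity events rather than GPS coordinates. The identifiers, thresholds, and field names are hypothetical.

```python
# Illustrative sketch only: a contact log built on proximity events rather than
# GPS locations. All identifiers, thresholds, and field names are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProximityEvent:
    other_device_token: str    # rotating anonymous token broadcast by the other device
    duration_minutes: float    # how long the two devices stayed near each other
    est_distance_m: float      # rough distance estimated from signal strength

@dataclass
class ContactLog:
    events: List[ProximityEvent] = field(default_factory=list)

    def record(self, token: str, duration_minutes: float, est_distance_m: float) -> None:
        # Note what is never stored: no coordinates, no record of where the contact happened.
        self.events.append(ProximityEvent(token, duration_minutes, est_distance_m))

    def close_contacts(self, min_minutes: float = 15.0, max_distance_m: float = 2.0) -> List[str]:
        # Tokens of devices that were close enough, for long enough, to count as contacts.
        return [e.other_device_token for e in self.events
                if e.duration_minutes >= min_minutes and e.est_distance_m <= max_distance_m]
```

The point of the sketch is the data that never appears: such a log can answer "who was I near, and for how long?" without ever knowing where anyone was.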
Mark Wood: I mean, you mentioned these long, legalistic documents that we're expected to read, and that nobody looks at. I have scrolled down, and sometimes they require you to scroll down to the bottom of them and click something as if you've actually read it, and nobody does. It's so complex, I think a lot of people just go numb to the whole thing, and they're not really admitting to themselves that they're giving up their privacy, they're just closing that off in their mind. And it's like that, "I don't have any choice, so this is what I'm going to do, and I'll hope for the best." I mean, is there a minimum level of awareness that we should all aspire to about this sort of thing? Is there a way to not get so lost in the details that we just throw up our hands?
Eleanor Birrell: I think it's helpful if people have some awareness of common practices, at least about the capabilities that some of these companies have, that they can determine your location, they can track your behavior across different websites. But I don't think it needs to be the responsibility, or should be the responsibility, of the individual to monitor what every company is doing with their data.
Patty Vest: Eleanor, you're also interested in technology-assisted teaching, and that's something a lot of people are interested in right now, just because of the times that we're living in. What part of this area do you focus on?
Eleanor Birrell: Over the last number of years, I've been fortunate to be involved with these elite institutions that have really wonderful groups of students and colleagues. But working in these institutions necessarily makes you realize that the people who have access to my work and my teaching are very self-selected already. They need to have passed an extraordinarily high bar just to get in the door. And so, one of the projects I've been involved with is creating an online security course through eCornell that would be accessible to a broader community.
Mark Wood: Again, let's pursue that a little bit, just philosophically. I mean, institutions like Pomona, and I'm sure a lot of its peers, haven't looked at the idea of distance learning before with much interest, let's say, partly because the product they sell is this close relationship with faculty and very intimate education: the professor on one end of the log and the student on the other. Do you think this crisis is going to change the way institutions like Pomona think about those things, and the way faculty at a place like this think about them?
Eleanor Birrell: Only in a limited sense. I think a lot of what's becoming obvious, having done half a semester online now, is the strengths of the residential college and these small classrooms and the one-on-one time with students. Definitely, moving to an asynchronous format where you don't have the interaction with a professor, or moving to a larger form factor where you have hundreds of thousands of students, is not going to provide the same service. And I don't think of it as a competition or a replacement; I think it's becoming increasingly, universally recognized that this is not an equivalent or a replacement or something with the same experience. But I still think online courses can be more than zero. And if that's the baseline you're starting from, then you can provide something that's an improvement.
Patty Vest: Eleanor, you mentioned that you were just coming from one of your classes right before this recording. What classes are you teaching this spring, and what are some of the classes that you teach?
Eleanor Birrell: So, this spring I'm teaching the computer systems course, which is the upper-level course that covers low-level computer design and implementation: how computers actually store data, how computers actually execute instructions, how the hardware interacts with the software to build up things like operating systems, which we all use every day, and how those abstractions that support higher-level programs actually work, how they impact performance, correctness and security. So that's the one I'm teaching this semester. I like to think of that course as everything you ever wanted to know about computers but didn't know you wanted to know. [inaudible 00:17:11] should be tied to all of the stuff we see and experience when we use computers every day.
Mark Wood: You convinced me, I want to take the course.
Eleanor Birrell: It does have a three-semester prerequisite series though.
Mark Wood: Unfortunately.
Patty Vest: Of course, it's an upper-level course, Mark.
Eleanor Birrell: In addition to that course, I'm also part of a group of faculty who co-teach the introduction to computer science course. And I also teach an upper-level elective on system security.
Mark Wood: So let's move over to your research again, a little bit. What are you doing now? Do you have some research projects underway, or do you have some in mind that you want to get moving on?
Eleanor Birrell: Yes. I have a couple of projects that are in various stages of progress at the moment, some of them with Pomona students, some of them with external collaborators. One of the projects I was working on with a student was on designing and implementing visualization tools for these privacy policies: a graphical replacement for the text-based details that would hopefully sit at an intermediate level and allow users to explore and get some intuition for the contents of these policies without having to read through all of the details. I have another project that's looking at what factors cause people to trust different applications, and how those different signals influence assumptions about data collection practices. So both of those are in the area that we generally refer to as usable security and privacy. On the other end of the spectrum, I also have ongoing work on this idea of building and implementing use-based privacy systems. We're particularly looking at machine learning applications in the context of location data.
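As a rough sketch of what "use-based" enforcement means in practice, and not a description of Birrell's published system, the toy example below attaches a set of permitted uses to a piece of data and checks every access against that set, rather than relying on a one-time consent click. The use categories, names, and example values are hypothetical.

```python
# Toy sketch only: data tagged with permitted uses, checked at every access.
# The use categories, names, and example values are hypothetical.
from dataclasses import dataclass
from typing import Any, FrozenSet

@dataclass(frozen=True)
class TaggedData:
    value: Any
    permitted_uses: FrozenSet[str]   # e.g. {"navigation", "traffic-aggregation"}

class UseNotPermitted(Exception):
    pass

def use(data: TaggedData, purpose: str) -> Any:
    # Every consumer of the data goes through this gatekeeper.
    if purpose not in data.permitted_uses:
        raise UseNotPermitted(f"{purpose!r} is not a permitted use of this data")
    return data.value

# Location data that may be used for navigation but not for ad targeting.
location = TaggedData(value=(34.0967, -117.7198),
                      permitted_uses=frozenset({"navigation", "traffic-aggregation"}))
route_input = use(location, "navigation")   # allowed
# use(location, "ad-targeting")             # would raise UseNotPermitted
```

The restriction travels with the data, so the question the system asks is always "is this use allowed?" rather than "did the user once agree to a policy document?"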
Mark Wood: So you mentioned what causes people to trust certain applications. How do you study that, and what preconceptions do you go into it with? What ideas are you testing?
Eleanor Birrell: You start by drawing on your own experience. What makes you trust an application? Do you consider whether it has advertisements? Do you consider whether it is a free app or a paid app? Do you consider how popular it is, or how many downloads it has, or what the reviews are? And then you start tweaking those factors and showing different populations of users slightly different versions of this application, with one of those variables switched. And then you can ask them questions about their assumptions or their experience, how much they would trust it, what data they think would be collected, and you can draw inferences about what sorts of factors are influencing people's decisions.
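To illustrate the kind of between-subjects setup being described, and not the actual study design, here is a small sketch that varies exactly one factor of a hypothetical app listing per participant. The factors, values, and survey question are made up for illustration.

```python
# Illustrative sketch only: assign each participant one app-listing variant that
# differs from a baseline in exactly one factor. Factors and values are made up.
import hashlib
import random

BASE_LISTING = {"has_ads": False, "price": "free", "downloads": "1M+", "rating": 4.6}

FACTORS = [
    ("has_ads", [True, False]),
    ("price", ["free", "$2.99"]),
    ("downloads", ["500+", "1M+"]),
]

def assign_listing(participant_id: str) -> dict:
    # Deterministic per participant: the same ID always sees the same variant.
    seed = int(hashlib.sha256(participant_id.encode()).hexdigest(), 16)
    rng = random.Random(seed)
    factor, values = rng.choice(FACTORS)
    listing = dict(BASE_LISTING)
    listing[factor] = rng.choice(values)
    return listing

# Each participant then answers questions about the one listing they saw, e.g.
# "How much would you trust this app with your data?" on a 1-5 scale, and
# comparing answers across conditions suggests which factors drive trust.
```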
Patty Vest: Eleanor, you mentioned that some students are involved in your research. What roles do they play, and how are they involved in your projects?
Eleanor Birrell: The students are involved at all levels of this. For the project on looking at factors that influence trust and assessment, I have students involved who are collaborating with me on designing the studies, and hypothesizing and brainstorming the factors that we'll want to consider, and analyzing the data, and writing up the paper. I'm working on that with a student here and with a collaborator and some students at Wellesley as well. So that's a project where we're all working together on all of the different components of it.
Eleanor Birrell: The project on visualizing privacy policies was an idea that was really driven by one of my thesis students. He came up with most of the details of the design, he was implementing it, and I was playing a much more advisory role in that project. So, I work with students at all different levels: some of them have just finished their first year, just finished the intro course, and some of them are seniors and have a better idea of exactly what they want to work on and how they want to approach the problem.
Mark Wood: So I assume that you're able to continue that kind of work remotely. I guess, in some ways, computer science is one of the fields that's a little friendlier to that kind of remote work than some others. How do you work with your students during this?
Eleanor Birrell: Yes, I definitely do a lot of work remotely with collaborators, and I have done for years. We meet regularly; during the school year, that's usually about once a week, and over the summer, we tend to meet once a day. We brainstorm things, we put together concrete plans for what we're going to do for the rest of the day, and then continue iterating and repeating. Since a lot of projects involve implementing code and analyzing data, all of those projects can be done remotely.
Patty Vest: There's an increasing interest in computer science from college-bound students. As a professor of computer science, why do you think that is?
Eleanor Birrell: I think it's very natural that if you're surrounded by these devices, you might want to understand them: how they work, how they interact with society, how they should interact with society. There's a classic quote that "any sufficiently advanced technology is indistinguishable from magic," and most of us think of it as walking around with these magic boxes in our pockets that do stuff. But humans like to understand things, and having the opportunity to understand these things and make them no longer magic, but actually something that you can control and design and build for yourself, is certainly something that resonated with me. And it seems natural to me that it would resonate with many incoming students. They're growing up in a digital generation where they've had computers and smartphones around them their entire lives, and that's a natural thing to want to study.
Mark Wood: So, tell us about the computer science department here at Pomona. What's special about it?
Eleanor Birrell: There are nine of us, and we work together very closely. It's a great group of people with diverse interests and a common agenda of wanting to teach our students well and play with our research, and really build a community for everyone who is involved with computer science, whether that's taking one course, or a distribution requirement, or becoming a major, or going on with it after they graduate.
Patty Vest: You've also spoken about the importance of increasing diversity in a field like computer science. What can be done to solve this problem?
Eleanor Birrell: I think the biggest challenge for computer science has often been getting people in the door in the first place. And if we can get students to take that first course, they often discover, "Maybe this is something I like, maybe this is something I can do, maybe this is me after all." And once you get enough people in that room, the room starts looking very different, and I start feeling more comfortable there, and I hope my students start feeling more comfortable there, when it looks like a more representative sample of our broader community.
Mark Wood: So you mentioned building community within the department. How do you go about that?
Eleanor Birrell: There are a number of ways that we go about that. Some of the ones I'm most actively involved in are having a community around the lab, where people often work on their assignments in the same room in the department, rather than working on their assignments in their room. And that means you're not doing the stereotypical coding on your own in the dark; you're programming with a bunch of friends and a bunch of colleagues, and the course staff, and the mentors, and the TAs, and that feels like a very social experience. And it's generally a lot more fun.
Patty Vest: You talked about the role your students have in your research projects, and some of them working on their thesis. What kinds of things do your students go on to do after Pomona?
Eleanor Birrell: Whatever they want to do. Many of our students go on into industry, some in tech companies, some in more diverse fields, as we're increasingly seeing that computing is applicable to all sorts of areas and all sorts of businesses. So we have students working across a variety of fields, some of them in design work, some of them in programming work, whatever it is that they want to do.
Patty Vest: What is some advice that you have for incoming students who are thinking about computer science?
Eleanor Birrell: Try computer science, and also try something else.
Mark Wood: Okay, thanks. On that note, we're going to need to wrap this up. We've been talking about digital security and computer science in general with Assistant Professor of Computer Science Eleanor Birrell. Thanks, Eleanor.
Eleanor Birrell: Thank you for having me.
Patty Vest: And to all who stuck with us this far, thanks for listening to the Sagecast, the podcast of Pomona College. Stay safe, and until next time.